Charting the path to a sustainable future with AI for energy efficiency
You can’t greenwash AI. In 2024, organizations will have greater clarity and insights into achieving sustainability outcomes.
In 2024, measurable commitments to sustainability have become table stakes for every business. According to Net Zero Tracker, although more companies than ever are committing to net zero targets, only a small percentage of these meet the United Nations (UN) criteria for reaching the goal.
The UN Race to Zero campaign, which set out revised ‘Starting Line criteria’ in June 2022, asks members to implement immediate emission-cutting measures; set a specific net zero target; cover all greenhouse gases (and, for companies, all emission scopes); apply clear conditions to the use of offsets; publish a plan; and report annually on progress against both interim and longer-term targets.
At the recent COP28 climate summit, nearly 200 countries reached a historic consensus to reduce global consumption of fossil fuels to avert the worst effects of climate change. Hailed as signaling the beginning of the end of the oil era, the agreement calls on countries to triple renewable energy capacity globally by 2030, speed up efforts to reduce coal use, and accelerate technologies such as carbon capture and storage that can clean up hard-to-decarbonize industries.
AI’s sustainability challenge
However, even with these commitments and technological innovations, energy consumption is expected to rise with the explosive adoption of artificial intelligence (AI). Large language models (LLMs) are more energy-intensive than most other forms of computing: training one requires hundreds of Graphics Processing Units (GPUs) working together for days at a time without interruption, with each GPU drawing between 250 and 300 watts.
For instance, Megatron-LM, Nvidia’s highly optimized and efficient library for training large language models, used 512 GPUs running for nine days to train its final version, equating to roughly 27,648 kilowatt-hours. According to the U.S. Energy Information Administration, a typical American household purchased 10,791 kilowatt-hours of electricity per year as of 2022. That means training Megatron-LM’s final version used roughly as much energy as two and a half homes consume in a year.
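The arithmetic behind those figures can be checked with a quick back-of-the-envelope sketch. The per-GPU wattage is an assumption taken from the low end of the 250–300 W range cited above; the GPU count, duration, and household figure come from the text.

```python
# Back-of-the-envelope check of the training-energy figures above.
# Assumed: 512 GPUs at ~250 W each (low end of the cited range),
# running continuously for nine days.
GPUS = 512
WATTS_PER_GPU = 250
HOURS = 9 * 24                     # nine days of uninterrupted training

total_kwh = GPUS * WATTS_PER_GPU * HOURS / 1000   # watt-hours -> kWh
print(total_kwh)                   # 27648.0 kWh

# Compare against a typical US household (EIA, 2022): 10,791 kWh/year.
HOUSEHOLD_KWH_PER_YEAR = 10_791
print(round(total_kwh / HOUSEHOLD_KWH_PER_YEAR, 1))   # 2.6 homes' annual use
```

At 250 W per GPU the numbers reproduce the cited 27,648 kWh exactly, which suggests the article’s figure assumes the lower bound of the power range.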
The computing power required to classify, analyze, and respond to AI queries is also exceptionally high, resulting in significant system costs, inefficiencies, and greenhouse gas emissions. This is particularly true for LLMs, such as ChatGPT, which alone has been reported to cost millions of dollars daily to run.
Unlike previous computing booms, training and running LLMs carries a structural cost that remains even after the software has been built or initially trained. Given the billions of calculations required to generate a response to a single prompt, these models demand far more computing power to run than serving web-based applications or pages.
There is a growing demand for higher-performing and less expensive inference AI solutions that can reduce AI’s overall carbon footprint. By creating and putting these higher-efficiency, lower-power solutions into use, we can sustainably address the current and future needs of generative AI and other AI-driven solutions, including fraud detection, translation services, chatbots, and many other current use cases, as well as those yet to be created.
Building energy-efficient AI systems
While inference AI currently accounts for a small percentage of overall energy use, that share is growing rapidly as inference serves energy-hungry generative AI applications. Organizations driving adoption and using AI are under pressure to measure and publish data on energy use and sources. Creating and employing more energy-efficient infrastructure, optimizing models, and implementing software tools and algorithms that track and reduce computational workload during inference are critical.
Enterprises employing AI on current infrastructure can also become more energy efficient by using smaller models purpose-built for specific use cases.
In her annual predictions on coming technology trends for the year ahead, Liz Centoni, Cisco Chief Strategy Officer and GM of Applications, offered insight. “Smaller AI models with fewer layers and filters that are domain-specific account for less energy consumption and costs than general systems.”
“These dedicated systems are trained on smaller, highly accurate data sets and efficiently accomplish specific tasks. In contrast, deep learning models require processing vast amounts of data to achieve results,” she explained.
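Centoni’s point can be illustrated with a common rule of thumb: inference compute scales roughly with parameter count, at about 2 FLOPs per parameter per generated token. The model sizes below are hypothetical, chosen only to show the scale of the gap between a general-purpose model and a domain-specific one.

```python
# Rough illustration of why smaller, domain-specific models consume less
# energy per query. Rule of thumb (an approximation, not an exact law):
# ~2 FLOPs per parameter per generated token.
def flops_per_token(params: float) -> float:
    return 2 * params

general = flops_per_token(70e9)    # hypothetical 70B-parameter general model
specific = flops_per_token(3e9)    # hypothetical 3B-parameter specialist

print(general / specific)          # ~23x more compute (and energy) per token
```

If a small model matches the big one’s accuracy on its narrow task, that compute ratio translates almost directly into energy savings.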
Smart energy management is also a crucial component to address climate change. According to the Natural Resources Defense Council’s recent Clean Energy Now for a Safer Climate Future: Pathways to Net Zero in the United States by 2050 report, by combining electrification with energy efficiency upgrades, it is possible to reduce building-related fossil fuel consumption and its associated emissions by over 90 percent when compared to current levels.
A new era of energy networking
Among its many promising applications, we see AI unlocking a new era of energy networking and efficiency models. With advances in energy networking and improved energy efficiency, we can significantly reduce the world’s energy needs by 2050, and along the way gain better control over global greenhouse gas emissions.
The fast-emerging category of energy networking, which combines software-defined networking capabilities with an electric power system made up of direct current (DC) microgrids, will also contribute to energy efficiency, delivering increased visibility, insights, and automation.
Power over Ethernet, a method of delivering DC power to devices over copper Ethernet cabling, eliminates the need for separate power supplies and outlets. As a low-voltage solution, it also reduces energy costs by allowing centralized control of lighting, video cameras and monitors, window shades, and heating and cooling, among many other devices found in buildings and homes.
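A concrete planning question with Power over Ethernet is how many powered devices a switch can sustain. The sketch below uses the per-port power limits from the IEEE 802.3 PoE standards family (802.3af ≈ 15.4 W, 802.3at ≈ 30 W, 802.3bt Type 4 ≈ 90 W delivered by the port); the 740 W switch budget in the example is a hypothetical figure.

```python
# Hypothetical PoE power-budget check. Per-port limits follow the IEEE
# 802.3 PoE standards (approximate power sourced by the switch port).
POE_PORT_WATTS = {"802.3af": 15.4, "802.3at": 30.0, "802.3bt": 90.0}

def ports_supported(switch_budget_watts: float, standard: str) -> int:
    """How many fully loaded ports a switch power budget can sustain."""
    return int(switch_budget_watts // POE_PORT_WATTS[standard])

# Example: a switch with a hypothetical 740 W PoE budget.
print(ports_supported(740, "802.3af"))   # 48 low-power devices
print(ports_supported(740, "802.3at"))   # 24 higher-power devices
```

Because the network knows each port’s draw, this same accounting is what lets energy networking report and optimize consumption device by device.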
By applying networking to power and connecting it with data, energy networking and Power over Ethernet can provide comprehensive visibility and benchmarking of existing emissions and an access point to optimize power usage, distribution, transmission, and storage, as well as measurement and reporting.
Centoni said these methods will make measuring energy usage and emissions more accurate, automate many functions across IT, smart buildings, and IoT sensors, and help reclaim inefficiently used and idle energy:
“With embedded energy management capabilities, the network will become a control plane for measuring, monitoring, and managing energy consumption.”
—Liz Centoni, Cisco EVP, Chief Strategy Officer, and GM of Applications
Together, these solutions will be a catalyst for vast new AI-powered capabilities without imposing an unsustainable toll on the environment. They can also enable better energy management and storage, allowing companies to meet their increasing energy consumption and sustainability goals.
This is one of a series of blogs exploring Cisco EVP, Chief Strategy Officer, and GM of Applications Liz Centoni’s tech predictions for 2024, with AI as both catalyst and canvas for innovation. Her complete tech trend predictions can be found in The Year of AI Readiness, Adoption and Tech Integration ebook.
Catch the other blogs in the 2024 Tech Trends series.